49 results for 080101 Adaptive Agents and Intelligent Robotics

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

Many organizations struggle with the massive amount of data they collect. Today, data do more than serve as ingredients for churning out statistical reports: they support efficient operations in many organizations and, to some extent, provide the competitive intelligence organizations need to survive in today's economy. Data mining cannot always deliver timely and relevant results because the underlying data are constantly changing. Stream-data processing, however, may be more effective, judging by the Matrix project.
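
A minimal sketch of the batch-versus-stream contrast drawn above, assuming a simple numeric event stream and a fixed-size window; the class and variable names are illustrative and are not taken from the Matrix project. The point is only that summaries are updated incrementally as each record arrives, so results stay current without re-running a batch mining job over data that have already changed.

```python
from collections import deque

class SlidingWindowStats:
    """Incrementally maintained statistics over the most recent N events."""

    def __init__(self, window_size: int):
        self.window = deque(maxlen=window_size)
        self.total = 0.0

    def update(self, value: float) -> float:
        # Evict the oldest value once the window is full, so the summary
        # always reflects the current state of the stream.
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0]
        self.window.append(value)
        self.total += value
        return self.total / len(self.window)

# Each arriving event updates the summary in O(1), instead of recomputing
# over the full (and already stale) data set.
stats = SlidingWindowStats(window_size=3)
for reading in [10.0, 12.0, 11.0, 40.0]:
    print(stats.update(reading))
```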

Relevance: 100.00%

Abstract:

Research in pursuit of an effective response to the demands for a sustainable architecture has led towards the conception of a Renewable, Adaptive, Recyclable and Environmental (R.A.R.E.) building typology. The term R.A.R.E. expresses issues that have assumed central importance in the current architectural debate. This paper establishes the principles of the typology, drawing on the content and pedagogical methods applied in a fourth-year academic course in building technology. The R.A.R.E. methodology is presented to and explored by students in the search for a definition of an innovative architecture that is both progressive and sustainable. The unit is structured into eight subjects: Sustainable Site & Climate Analysis; Flexible & Adaptive Structural Systems; Renewable & Environmental Building Materials; Modular Building Systems; Innovative Building Envelope Systems; Renewable & Non-conventional Energy Systems; Innovative Heating, Ventilation & Air Conditioning Systems; and Water Collection & Storage Systems. Through a holistic and integrated approach, the unit presents a comprehensive overview of these ‘Sustainable Building Categories’ so that students can produce a guide to the design requirements of a R.A.R.E. architecture.

Relevance: 100.00%

Abstract:

This paper tells a story of the synergy between two cutting-edge technologies: agents and data mining. Integrating the two enhances the power of each. By integrating agents into data mining systems, or constructing data mining systems from an agent perspective, the flexibility of data mining systems can be greatly improved: new data mining techniques can be added to a system dynamically in the form of agents, while out-of-date ones can be removed at run-time. Conversely, equipping agents with data mining capabilities makes the agents smarter and more adaptable, improving the performance of agent systems. A new way to integrate the two techniques, ontology-based integration, is also discussed. Case studies are given to demonstrate this mutual enhancement.
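
As a rough illustration of the first direction of integration described above (mining techniques plugged into a system as agents), the following sketch registers and retires "miner agents" at run-time behind a common interface. The class names and the toy techniques are assumptions for illustration, not the paper's architecture.

```python
from typing import Callable, Dict, List

class MinerAgent:
    """An agent wrapping a single data mining technique behind a common interface."""

    def __init__(self, name: str, mine: Callable[[List[float]], dict]):
        self.name = name
        self.mine = mine

class MiningSystem:
    """A data mining system whose techniques are pluggable agents."""

    def __init__(self):
        self.agents: Dict[str, MinerAgent] = {}

    def register(self, agent: MinerAgent) -> None:
        # A new technique is added at run-time simply by registering an agent.
        self.agents[agent.name] = agent

    def retire(self, name: str) -> None:
        # An out-of-date technique is removed without stopping the system.
        self.agents.pop(name, None)

    def run(self, data: List[float]) -> Dict[str, dict]:
        return {name: agent.mine(data) for name, agent in self.agents.items()}

system = MiningSystem()
system.register(MinerAgent("summary", lambda xs: {"mean": sum(xs) / len(xs)}))
system.register(MinerAgent("extremes", lambda xs: {"min": min(xs), "max": max(xs)}))
print(system.run([1.0, 2.0, 3.0]))
system.retire("extremes")  # delete an out-of-date technique at run-time
```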

Relevance: 100.00%

Abstract:

It was the end of the academic year in 2020. On the third floor of the Southern Health University's administrative building, the five Vice-Chancellors of the Health Universities were gathering. Outside it was raining, but the patter of drops on the window only served to enhance the coziness of the meeting room. This annual event would take three days. The Vice-Chancellors enjoyed this meeting. They seldom saw each other during the busy academic year and under Professor Hope Brightview's 6 years of chairmanship they had been able to work well together to formulate a rational and profitable distribution of teaching, research and consultation services.

Relevance: 100.00%

Abstract:

This is the protocol for a review and there is no abstract. The objectives are as follows:

To assess the effects of nurse-led titration of ACEIs, beta-adrenergic blocking agents and ARBs in patients with left ventricular systolic dysfunction in terms of safety and patient outcomes.

Relevance: 100.00%

Abstract:

Diel vertical migration (DVM) by zooplankton is a universal feature in all the World's oceans, as well as being common in freshwater environments. The normal pattern involves movement from shallow depths at night to greater depths during the day. For many herbivorous and omnivorous mesozooplankton that feed predominantly near the surface on phytoplankton and microzooplankton, minimising the risk of predation from fish seems to be the ultimate factor behind DVM. These migrants appear to use deep water as a dark daytime refuge where their probability of being detected and eaten is lower than if they remained near the surface. Associated with these vertical movements of mesozooplankton, predators at higher trophic levels, including invertebrates, fish, marine mammals, birds and reptiles, may modify their behaviour to optimise the exploitation of their vertically migrating prey. Recent advances in biotelemetry promise to allow the interaction between migrating zooplankton and diving air-breathing vertebrates to be explored in far more detail than hitherto.

Relevance: 100.00%

Abstract:

The need for guiding model formulation of normative social systems in support of a digital ecosystem is introduced. Normative social systems improve the understanding of computational social processes in simulation and experimentation, and provide support for digital ecosystem developments. However, a successful simulation requires the appropriate implementation of a conceptual model. It is proposed that a heuristic formalism of agents, networks and environments complements the conventional creative approach to model formulation by guiding the formulation of conceptual models via abstract components and by facilitating their interface with other components in a digital environment.
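
A minimal sketch of how such abstract components might look in code, assuming a trivial majority-rule norm and a fully connected network purely for illustration; none of these names or rules come from the paper. The Agent, Network, and Environment abstractions guide model formulation, while concrete subclasses supply the creative content.

```python
from abc import ABC, abstractmethod
from typing import List
import random

class Agent(ABC):
    """Abstract agent component: local state plus a decision rule applied each step."""
    @abstractmethod
    def step(self, neighbours: List["Agent"]) -> None: ...

class Network(ABC):
    """Abstract network component: defines which agents interact."""
    @abstractmethod
    def neighbours(self, agent: Agent) -> List[Agent]: ...

class NormAgent(Agent):
    def __init__(self, complies: bool):
        self.complies = complies

    def step(self, neighbours: List[Agent]) -> None:
        # Illustrative norm-adoption rule: follow the local majority.
        if neighbours:
            self.complies = sum(n.complies for n in neighbours) > len(neighbours) / 2

class CompleteNetwork(Network):
    def __init__(self, agents: List[Agent]):
        self.agents = agents

    def neighbours(self, agent: Agent) -> List[Agent]:
        return [a for a in self.agents if a is not agent]

class Environment:
    """Environment component: schedules agent interactions over the network."""
    def __init__(self, agents: List[Agent], network: Network):
        self.agents, self.network = agents, network

    def run(self, steps: int) -> None:
        for _ in range(steps):
            for agent in self.agents:
                agent.step(self.network.neighbours(agent))

agents = [NormAgent(random.random() < 0.6) for _ in range(10)]
Environment(agents, CompleteNetwork(agents)).run(steps=5)
print(sum(a.complies for a in agents), "of", len(agents), "agents comply with the norm")
```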

Relevance: 100.00%

Abstract:

BACKGROUND: Heart failure is associated with high mortality and hospital readmissions. Beta-adrenergic blocking agents, angiotensin converting enzyme inhibitors (ACEIs), and angiotensin receptor blockers (ARBs) can improve survival and reduce hospital readmissions and are recommended as first-line therapy in the treatment of heart failure. Evidence has also shown that there is a dose-dependent relationship of these medications with patient outcomes. Despite this evidence, primary care physicians are reluctant to up-titrate these medications. New strategies aimed at facilitating this up-titration are warranted. Nurse-led titration (NLT) is one such strategy.

OBJECTIVES: To assess the effects of NLT of beta-adrenergic blocking agents, ACEIs, and ARBs in patients with heart failure with reduced ejection fraction (HFrEF) in terms of safety and patient outcomes.

SEARCH METHODS: We searched the Cochrane Central Register of Controlled Trials in the Cochrane Library (CENTRAL Issue 11 of 12, 19/12/2014), MEDLINE OVID (1946 to November week 3 2014), and EMBASE Classic and EMBASE OVID (1947 to 2014 week 50). We also searched reference lists of relevant primary studies, systematic reviews, clinical trial registries, and unpublished theses sources. We used no language restrictions.

SELECTION CRITERIA: Randomised controlled trials (RCTs) of NLT of beta-adrenergic blocking agents, ACEIs, and/or ARBs, comparing the optimisation of these medications by a nurse with optimisation by another health professional, in patients with HFrEF.

DATA COLLECTION AND ANALYSIS: Two review authors (AD & JC) independently assessed studies for eligibility and risk of bias. We contacted primary authors if we required additional information. We examined quality of evidence using the GRADE rating tool for RCTs. We analysed extracted data by risk ratio (RR) with 95% confidence interval (CI) for dichotomous data to measure effect sizes of the intervention group compared with the usual-care group. Meta-analyses used the fixed-effect Mantel-Haenszel method. We assessed heterogeneity between studies by Chi² and I².

MAIN RESULTS: We included seven studies (1684 participants) in the review. One study enrolled participants from a residential care facility, and the other six studies from primary care and outpatient clinics. All-cause hospital admission data were available in four studies (556 participants). Participants in the NLT group experienced a lower rate of all-cause hospital admissions (RR 0.80, 95% CI 0.72 to 0.88, high-quality evidence) and fewer hospital admissions related to heart failure (RR 0.51, 95% CI 0.36 to 0.72, moderate-quality evidence) compared to the usual-care group. Six studies (902 participants) examined all-cause mortality. All-cause mortality was also lower in the NLT group (RR 0.66, 95% CI 0.48 to 0.92, moderate-quality evidence) compared to usual care. Approximately 27 deaths could be avoided for every 1000 people receiving NLT of beta-adrenergic blocking agents, ACEIs, and ARBs. Only three studies (370 participants) reported outcomes on all-cause and heart failure-related event-free survival. Participants in the NLT group were more likely to remain event free compared to participants in the usual-care group (RR 0.60, 95% CI 0.46 to 0.77, moderate-quality evidence). Five studies (966 participants) reported on the number of participants reaching target dose of beta-adrenergic blocking agents. This was also higher in the NLT group compared to usual care (RR 1.99, 95% CI 1.61 to 2.47, low-quality evidence). However, there was a substantial degree of heterogeneity in this pooled analysis. We rated the risk of bias in these studies as high, mainly due to a lack of clarity regarding incomplete outcome data, lack of reporting on adverse events associated with the intervention, and the inability to blind participants and personnel. Participants in the NLT group reached maximal dose of beta-adrenergic blocking agents in half the time compared with participants in usual care. Two studies reported on adverse events; one stated there were no adverse events, and the other found one adverse event but did not specify its type or severity.

AUTHORS' CONCLUSIONS: Participants in the NLT group experienced fewer hospital admissions for any cause and an increase in survival and in the number of participants reaching target dose within a shorter time period. However, the quality of evidence regarding the proportion of participants reaching target dose was low and should be interpreted with caution. We found high-quality evidence supporting NLT as one strategy that may improve the optimisation of beta-adrenergic blocking agents, resulting in a reduction in hospital admissions. Despite evidence of a dose-dependent relationship of beta-adrenergic blocking agents, ACEIs, and ARBs with improving outcomes in patients with HFrEF, the translation of this evidence into clinical practice is poor. NLT is one strategy that facilitates the implementation of this evidence into practice.
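
For readers unfamiliar with the pooling step named above, here is a minimal sketch of a fixed-effect Mantel-Haenszel risk ratio for dichotomous outcomes. The 2x2 counts are made up for illustration and are not data from the trials in the review.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Study:
    events_nlt: int    # events (e.g. admissions) in the nurse-led titration arm
    total_nlt: int     # participants in the nurse-led titration arm
    events_usual: int  # events in the usual-care arm
    total_usual: int   # participants in the usual-care arm

def mantel_haenszel_rr(studies: List[Study]) -> float:
    """Fixed-effect Mantel-Haenszel pooled risk ratio for dichotomous outcomes."""
    numerator = sum(s.events_nlt * s.total_usual / (s.total_nlt + s.total_usual)
                    for s in studies)
    denominator = sum(s.events_usual * s.total_nlt / (s.total_nlt + s.total_usual)
                      for s in studies)
    return numerator / denominator

# Illustrative counts only, not the data from the included trials.
studies = [Study(20, 100, 30, 100), Study(15, 80, 22, 85)]
print(f"Pooled RR (M-H, fixed effect): {mantel_haenszel_rr(studies):.2f}")
```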

Relevance: 100.00%

Abstract:

This paper presents an experimental framework for a virtual reality artwork, Duet, that employs a combination of live full-body motion capture and an Oculus Rift HMD to construct an experience through which a human User can spatially interact with an artificially intelligent Agent. The project explores conceptual notions of embodied knowledge transfer, shared poetics of movement and distortions of the body schema. Within this context, both the User and the Agent become performers, constructing an intimate and spontaneously generated proximal space. The project generates a visualization of the relationship between the User and the Agent without the context of a fixed VR landscape or architecture. The Agent's ability to retain and accumulate movement knowledge in a way that mimics human learning transforms an interactive experience into a collaborative one. The virtual representation of both performers is distorted and amplified in a dynamic manner, enhancing the potential for creative dialogue between the Agent and the User.

Relevance: 100.00%

Abstract:

The recent emergence of intelligent agent technology and advances in information gathering have been important steps forward in efficiently managing and using the vast amount of information now available on the Web to make informed decisions. There are, however, still many problems to overcome in the information gathering research arena before relevant information can be delivered to end users. Good decisions cannot be made without sufficient, timely, and correct information. Traditionally it is said that knowledge is power; nowadays, sufficient, timely, and correct information is power. Gathering relevant information to meet user information needs is therefore the crucial step in making good decisions. The ideal goal of information gathering is to obtain only the information that users need (no more and no less). However, the volume of information available, the diversity of its formats, its inherent uncertainties, and its distributed locations (e.g. the World Wide Web) hinder the process of gathering the right information to meet user needs. Specifically, two fundamental issues regarding the efficiency of information gathering are mismatch and overload. Mismatch means that some information meeting user needs has not been gathered (it is missed), whereas overload means that some gathered information is not what users need.

Traditional information retrieval has developed considerably over the past twenty years, and the introduction of the Web has changed people's perceptions of it. The task of information retrieval is usually considered to be leading the user to those documents that are relevant to his or her information needs; the complementary function is to filter out irrelevant documents (information filtering). Research into traditional information retrieval has provided many retrieval models and techniques for representing documents and queries. Nowadays, however, information is becoming highly distributed and increasingly difficult to gather, and user information needs contain many uncertainties. These factors motivate research in agent-based information gathering. In such systems, intelligent agents take commitments from their users and act on the users' behalf to gather the required information; they can retrieve relevant information from highly distributed, uncertain environments because of their intelligence, autonomy, and distribution. Current research on agent-based information gathering systems is divided into single-agent gathering systems and multi-agent gathering systems. In both areas there are still open problems to be solved before agent-based information gathering systems can retrieve uncertain information effectively from highly distributed environments.

The aim of this thesis is to develop a theoretical framework for intelligent agents to gather information from the Web. The research integrates the areas of information retrieval and intelligent agents. Its specific contributions are the development of an information filtering model for single-agent systems and the development of a dynamic belief model for information fusion in multi-agent systems. The research results are also supported by the construction of real information gathering agents (e.g., a Job Agent) for the Internet that help users gather useful information stored in Web sites. In such a framework, information gathering agents are able to describe (or learn) the user's information needs and act on the user's behalf to retrieve, filter, and/or fuse information.

A rough set based information filtering model is developed to address the problem of overload. The new approach allows users to describe their information needs over user concept spaces rather than over document spaces, and it views a user information need as a rough set over the document space. Rough set decision theory is used to classify new documents into three regions: a positive region, a boundary region, and a negative region. Two experiments are presented to verify this model, and they show that the rough set based model provides an efficient approach to the overload problem.

A dynamic belief model for information fusion in multi-agent environments is also developed. This model has polynomial time complexity, and it is proven that the fusion results are belief (mass) functions. Using this model, a collection fusion algorithm for information gathering agents is presented. The difficult case is where collections may be used by more than one agent; the algorithm uses cooperation between agents to provide a solution to this problem in distributed information retrieval systems.

This thesis thus presents solutions to the theoretical problems in agent-based information gathering systems, including information filtering models, agent belief modeling, and collection fusion. It also presents solutions to some of the technical problems in agent-based information systems, such as document classification, the architecture of agent-based information gathering systems, and decision making in multi-agent environments. Information gathering agents of this kind can gather relevant information from highly distributed, uncertain environments.
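
A much-simplified sketch of the three-region decision described above, assuming a plain set-overlap rule between a document's terms and a user concept space; the rule, the "Job Agent"-style concept terms, and the function name are illustrative assumptions, not the thesis's actual rough set model.

```python
from typing import Set

def classify_document(doc_terms: Set[str], positive_concepts: Set[str]) -> str:
    """Three-region decision in the spirit of rough set theory (simplified).

    positive: every informative term of the document falls inside the
              user's concept space (accept the document);
    boundary: the document only partly overlaps the concept space
              (defer, or ask the user);
    negative: no overlap at all (reject the document).
    """
    overlap = doc_terms & positive_concepts
    if overlap == doc_terms:
        return "positive"
    if overlap:
        return "boundary"
    return "negative"

# Hypothetical concept space for a job-search information need.
concepts = {"agent", "job", "vacancy", "salary"}
print(classify_document({"agent", "job"}, concepts))         # positive
print(classify_document({"agent", "football"}, concepts))    # boundary
print(classify_document({"weather", "forecast"}, concepts))  # negative
```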